Technology Tales

Adventures & experiences in contemporary technology

A rake of new cameras

30th August 2007

The websites of Amateur Photographer and Tech.co.uk are good places to find out what is happening in the world of digital cameras, which is just as well given the recent camera launching frenzy. It all seemed to start off with Canon’s EOS 40D and EOS 1Ds Mark III. The former seemed to be a case of playing catch up, but I still think that Canon should have used the opportunity to pull ahead, at least in the megapixel stakes; Sony is working on a 12 megapixel offering and could be about to make 12 megapixel sensors the norm for consumer digital SLRs, as they did at the 10 megapixel level. I realise that megapixels aren’t everything, but it has seemed to go like that thus far. Playing catch up doesn’t apply to the 1Ds Mk III with its monster sensor resolution of 21 megapixels and, needless to say, the improvements to the favoured DSLR of landscape photographers don’t stop there. Nikon were also in the fray with new 12 MP offerings: the D300 for the enthusiast and the D3 for the professional. The latter interestingly features a sensor that sits between full frame and the more usual APS-C sizes. Panasonic have also announced a new DSLR, and a number of manufacturers have new digital compacts on the market too; all of the previous makers have something, as does Olympus. It was amazing to see this all happening at once, but I suppose that’s how it goes. IFA has been on over the last week, but some launches preceded this; it’s usually something big like Photokina that results in this sort of thing…

Update: I have since discovered that Nikon’s D3 has a sensor sized in the full frame domain. It might be 36 mm x 23.9 mm rather than 36 mm x 24 mm, but the FX format comes extremely close, and the advent of full frame DSLRs being purveyed by a number of manufacturers may be upon us.

More Computing Equipment

12th January 2009

While I lived in Edinburgh, I largely stuck with local PC part resellers such as Ideal Computing or Silicon Group for my PC building needs. Since all my purchases had to be paid for in cash, given that I had no credit or debit cards in those days before the credit splurge that caused subsequent economic problems, that was just as well, and it was sufficient for my needs, which luckily were simpler at the time.

My move south to Macclesfield meant leaving those Edinburgh stores behind and looking for counterparts nearer my new home. I found one in Stockport and another in Heaton Chapel that gave me the service that I needed for as long as they lasted. The first was away from Stockport’s shopping precinct and supplied me with a full tower case and an AMD CPU before it closed. The second was part of the now defunct MicroDirect and was conveniently near a train station, so a PC case, motherboard, USB drive housing and WD 500 GB hard drive all came from there before financial trouble struck during the Great Recession. Restructuring allowed the Manchester store to stay open before it, too, shut its doors during 2014, taking the website operation with it. If I find a replacement for either of these, I might be tempted to give it a try.

Another thing that moving from Edinburgh brought my way was working for a living, which meant that I could get the debit and credit cards that were beyond me before then. That made online shopping more of a possibility. As ever, delivery arrangements are not the most convenient, with the need for traipsing around the country to courier depots, and I don’t fancy annoying neighbours with my deliveries either. However, my current job allows for working from home and this does help, though the option of Saturday and evening delivery still retains its attraction even if it is a more expensive one.

Over 20 years of making purchases does have you encountering a few computer equipment resellers, and many of the companies listed below have seen some business from me from time to time. My being easy to please may mean that I rarely have cause for complaint with any of the ones with which I have had dealings, apart from delivery inconveniences. The list should be a living one: economic conditions have taken their toll before and may do so again, so expect changes over time as we see how suppliers fare.

Argos

This surely has to be a strange entry to have at the top of this list yet they seem to have a greater range of laptop computers than Currys! My HP Pavilion dm4 came from one of their stores and it has been a successful purchase too. Otherwise, various items such as mobile broadband modems and even an external Seagate 2 TB hard drive have been acquired from them. When it comes to computing hardware, it seems that all that’s missing are PC components such as internal hard drives. It’s amazing how mainstream computing has become these days.

Box

This West Midlands company only recently came to my attention due to their Cube PCs. There is a wider range of computer goods that includes desktop machines from other manufacturers, and the range of laptops is extensive; TV and audio devices feature as well. The company has been around since 1996, so there is a track record too.

CCL Computers

It seems that PC Pro readers like this Yorkshire company a lot and I once had a colleague at work who swore by them too. There was a time when I ordered an AMD Athlon CPU from them and needed to return it when it didn’t work as I had hoped. Then, the service was what more should emulate, and efficient order fulfilment has continued into recent times too. More recent items have included a Western Digital hard disk and a Zalman ZM450-GS 450W power supply. Each did what was expected of it, so I have no complaints.

Currys

PC World was a pervasive name for so long until the holding company consolidated everything under the Currys name in much of the UK away from airports. The list of what I have purchased from their stores in Edinburgh, Stockport, Manchester and now Macclesfield over the decades rather shocks me. Thinking about it now, items bought there have included a Toshiba laptop picked up in a January sale, an Epson printer and a now retired Canon scanner. Evening opening means that an actual store can become a source of emergency purchases for those who need to be at a workplace during the working day, and that is how it has been for me on a number of occasions, such as when a power supply has failed.

Novatech

It was September 1997 when I made my first purchase from this long-established reseller. That was 16 MB of RAM for a Dell XPS 133 and it was not the last item to come from them either. The attraction then was the ability to pay by cheque for any goods obtained by mail order, and I think that DABS must have offered a similar arrangement since I ordered PC parts from them too. In those days, I was without a credit or debit card, so internet shopping was not so convenient; it has only become more pervasive since then.

More recently, one of my reasons for turning to them has been to get tested and pre-assembled bundles for system upgrades. One was a Gigabyte Z87-HD3 motherboard that came with an Intel Core i5 4670K CPU and 8 GB of DDR3 1600 MHz RAM installed on it, with the whole unit tested. It worked without any problems at all and that is more than can be said for some of the system upgrades that I have tried: 2001 was blighted by a destructive ASUS motherboard that wrecked AMD Athlon CPUs and an IBM Deskstar hard drive; 2009 was disrupted by a dead Gigabyte mainboard before I turned to a bare-bones system from Novatech. That whole unit appeared to have been sourced from Foxconn and had one of their A6VMX motherboards along with an AMD Athlon X2 7820 dual-core processor and 2 GB of DDR2 400 MHz RAM. More memory was added to get 4 GB in there, and hard drives and a DVD writer were installed to gain a working main PC after a few months of making do with other machines. My backup machine now has a Gigabyte H81 mainboard with an Intel Core i5 4570 CPU and 8 GB of RAM, which came as a pre-tested bundle and also worked without a problem.

There was a time when I needed to test out Novatech’s returns policy too with an order for what proved to be incompatible memory, and they did the needful without any problem. Other more mundane purchases have included 2 GB and 8 GB USB drives and there was nothing amiss with those. All in all, I’d continue to give Novatech some custom.

Quiet PC

As the name suggests, these are people who are concerned with providing quieter PC hardware, and that includes components as well as whole PCs.

Limiting Google Drive upload & synchronisation speeds using Trickle

9th October 2021

Having had a mishap that lost me some photos in the early days of my dalliance with digital photography, I have been far more careful since then and that now applies to other files as well. Doing regular backups is a must that you find reiterated by many different authors and the current computing climate makes doing that more vital than it ever was.

So, as well as having various local backups, I also have remote ones in the form of OneDrive, Dropbox and Google Drive. These more correctly are file synchronisation services, but disciplined use can make them useful as additional storage facilities in the interests of added resilience. There also are dedicated backup services that I have seen reviewed in the likes of PC Pro magazine, but I have yet to make use of those.

Insync

Part of my process for dealing with new digital photo files is to back them up to Google Drive and I did that with a Windows client in the early days but then moved to Insync running on Linux Mint. One drawback to the approach is that this hogs the upload bandwidth of an internet connection that has yet to move to fibre from copper cabling. Having fibre connections to a local cabinet helps but a 100 KiB/s upload speed is easily overwhelmed and digital photo file sizes keep increasing. It does not help that I insist on using more flexible raw formats like DNG, CR2 or CR3 either.

Making fewer images could help to cut the load but I still come away from an excursion with many files because I get so besotted with my surroundings. This means that upload sessions take numerous hours and can extend across calendar days. Ultimately, this makes my internet connection far less usable so I want to throttle upload speed much like what is possible in the Transmission BitTorrent client or in the Dropbox client. Unfortunately, this is not available in Insync so I have tried using the trickle command instead and an example is below:

trickle -d 2000 -u 50 insync

Here, the upload speed is limited to 50 KiB/s while the download speed is limited to 2000 KiB/s. In my case, the latter of these hardly matters while the former leaves me with acceptable internet usability. Insync does not work smoothly with this, however, so occasional restarts are needed to keep file uploads progressing and CPU load also is higher. As rough as the user experience feels, uploads can continue in parallel with other work.
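Since stalls do happen, one rough workaround is a small shell loop that relaunches the client under trickle whenever it is no longer running. This is only a sketch of the idea: it assumes that the process is simply named insync and that starting it again with the command above is enough to resume uploads; the ten-minute check interval is arbitrary.

#!/bin/bash
# Relaunch Insync under trickle if it stops running; check every ten minutes
while true; do
    if ! pgrep -x insync > /dev/null; then
        trickle -d 2000 -u 50 insync &
    fi
    sleep 600
done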

gdrive

One other option that I am exploring is the use of the command-line tool gdrive and this appears to work well with trickle. After downloading and installing the tool, getting going is a matter of issuing the following command and following the instructions:

gdrive about

On web servers, I even have the tool backing up things to Google Drive on a scheduled basis. Because of a Google Drive limitation that I have encountered not only with gdrive but also with Insync and Google’s own Windows Google Drive client, synchronisation only can happen with two new folders, one local and the other remote. Handily, gdrive supports the usual bash style commands for working with remote directories so something like the following will create a directory on Google Drive:

gdrive mkdir ttdc [ID for parent folder]

Here, the ID for the parent folder may be omitted but it can be obtained by going to Google Drive online and getting a link location by right-clicking on a folder and choosing the appropriate context menu item. This gets you something like the following and the required identifier is found between the last slash and the first question mark in the address string (so as not to share any real links, I made the address more general below):

https://drive.google.com/drive/folders/[remote folder ID]?usp=sharing

Then, synchronisation uses a command like the following:

gdrive sync upload [local folder or file path] [remote folder ID]

There also is the option to do a one-way upload and this is the form of the command used:

gdrive upload [local folder or file path] -p [remote folder ID]

Because every file or folder object has its own ID on Google Drive, it is possible to create two objects on there that appear to have the same name though that is sure to cause confusion even if you know what is happening. It is possible in each of the above to throttle them using trickle as well:

trickle -d 2000 -u 50 gdrive sync upload [local folder or file path] [remote folder ID]
trickle -d 2000 -u 50 gdrive upload [local folder or file path] -p [remote folder ID]

Handily, this works without the added drama seen with Insync and lends itself to scripting as well, so it could be something that I will incorporate into my current workflow. One thing that needs to be watched is file upload failures, but there may be ways to catch those and retry them, so that would be another thing that needs doing. This is built into Insync and it would be a learning opportunity if I were to stick with gdrive instead.
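As a sketch of how that retrying might look, the following loop re-runs the throttled sync a few times before giving up. The folder path is a placeholder and it assumes that gdrive returns a non-zero exit status when an upload fails, which would need verifying.

#!/bin/bash
# Retry a throttled gdrive sync up to three times, pausing between attempts
for attempt in 1 2 3; do
    if trickle -d 2000 -u 50 gdrive sync upload "$HOME/Photos" "[remote folder ID]"; then
        break
    fi
    echo "Attempt $attempt failed, retrying..." >&2
    sleep 60
done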

Moving a website from shared hosting to a virtual private server

24th November 2018

This year has seen some optimisation being applied to my web presences guided by the results of GTMetrix scans. It was then that I realised how slow things were, so server loads were reduced. Anything that slowed response times, such as WordPress plugins, got removed. Usage of Matomo also was curtailed in favour of Google Analytics while HTML, CSS and JS minification followed. What had yet to happen was a search for a faster server. Now, another website has been moved onto a virtual private server (VPS) to see how that would go.

Speed was not the only consideration since security was a factor too. After all, a VPS is more locked away from other users than a folder on a shared server. There also is the added sense of control, so Let’s Encrypt SSL certificates can be added using the Electronic Frontier Foundation’s Certbot. That avoids the expense of using an SSL certificate provided through my shared hosting provider and a successful transition for my travel website may mean that this one undergoes the same move.

For the VPS, I chose Ubuntu 18.04 as its operating system and it came with the LAMP stack already in place. Having offline development websites, the mix of Apache, MySQL and PHP is more familiar to me than anything using Nginx or Python. It also means that .htaccess files become more useful than they were on my previous Nginx-based platform. Having full access to the operating system by means of SSH helps too and should mean that I have fewer calls on technical support since I can do more for myself. Any extra tinkering should not affect others either, since this type of setup is well known to me, and having an offline counterpart means that anything riskier is tried there beforehand.

Naturally, there were niggles to overcome with the move. The first to fix was to make the MySQL instance accept calls from outside the server so that I could migrate data there from elsewhere and I even got my shared hosting setup to start using the new database to see what performance boost it might give. To make all this happen, I first found the location of the relevant my.cnf configuration file using the following command:

find / -name my.cnf

Once I had the right file, I commented out the following line that it contained and restarted the database service afterwards using another command to stop the appearance of any error 111 messages:

bind-address = 127.0.0.1
service mysql restart
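Depending on how the database accounts were created, a user that is permitted to connect from a remote host may be needed on top of the bind address change. The user, password and database names below are only placeholders for illustration:

sudo mysql -e "CREATE USER 'webuser'@'%' IDENTIFIED BY 'use-a-strong-password';"
sudo mysql -e "GRANT ALL PRIVILEGES ON exampledb.* TO 'webuser'@'%';"
sudo mysql -e "FLUSH PRIVILEGES;"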

After that, things worked as required and I moved onto another matter: uploading the requisite files. That meant installing an FTP server so I chose proftpd since I knew that well from previous tinkering. Once that was in place, file transfer commenced.
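For the record, installing proftpd on Ubuntu 18.04 is a single package command; the package name below is what I would expect on that release rather than something taken from my notes:

sudo apt install proftpd-basic
sudo systemctl restart proftpd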

When that was done, I could do some testing to see if I had an active web server that loaded the website. Along the way, I also instated some Apache modules like mod_rewrite using the a2enmod command, restarting Apache each time I enabled another module.
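For mod_rewrite, that pair of steps amounts to the following; the same pattern applies to any other module that needs enabling:

sudo a2enmod rewrite
sudo systemctl restart apache2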

Then, I discovered that Textpattern needed php7.2-xml installed, so the following command was executed to do this:

apt install php7.2-xml

Then, the following line was uncommented in the correct php.ini configuration file that I found using the same method as that described already for the my.cnf configuration and that was followed by yet another Apache restart:

extension=php_xmlrpc.dll

Addressing the above issues yielded enough success for me to change the IP address in my Cloudflare dashboard so it pointed at the VPS and not the shared server. The changeover happened seamlessly without having to await DNS updates as once would have been the case. It had the added advantage of making both WordPress and Textpattern work fully.

With everything working to my satisfaction, I then followed the instructions on Certbot to set up my new Let’s Encrypt SSL certificate. Aside from a tweak to a configuration file and another Apache restart, the process was more automated than I had expected so I was ready to embark on some fine-tuning to embed the new security arrangements. That meant updating .htaccess files and Textpattern has its own, so the following addition was needed there:

RewriteCond %{HTTPS} !=on
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

This complemented what was already in the main .htaccess file and WordPress allows you to include http(s) in the address it uses, so that was another task completed. The general .htaccess only needed the following lines to be added:

RewriteCond %{SERVER_PORT} 80
RewriteRule ^(.*)$ https://www.assortedexplorations.com/$1 [R,L]

What all these achieve is to redirect insecure connections to secure ones for every visitor to the website. After that, internal hyperlinks without https needed updating along with any forms so that a padlock sign could be shown for all pages.

With the main work completed, it was time to sort out a lingering niggle regarding the appearance of an FTP login page every time a WordPress installation or update was requested. The main solution was to make the web server account the owner of the files and directories, but the following line was added to wp-config.php as part of the fix even if it probably is not necessary:

define('FS_METHOD', 'direct');
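The ownership change itself is a single command; the path below is only an example since it depends on where the site lives on the server:

sudo chown -R www-data:www-data /var/www/example-site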

There also was the non-operation of WP Cron and that was addressed using WP-CLI and a script from Bjorn Johansen. To make double sure of its effectiveness, the following was added to wp-config.php to turn off the usual WP-Cron behaviour:

define('DISABLE_WP_CRON', true);
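As I recall, that approach boils down to running WP-CLI’s own cron runner on a schedule, so a crontab entry along the following lines should do the job; the five-minute interval and the site path are assumptions of mine rather than anything prescribed:

*/5 * * * * cd /var/www/example-site && wp cron event run --due-now > /dev/null 2>&1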

Intriguingly, WP-CLI offers a long list of possible commands that are worth investigating. A few have been examined but more await attention.

Before those, I still need to get my new VPS to send emails. So far, sendmail has been installed, the hostname changed from localhost and the server restarted. More investigations are needed, but what I have now is faster than what was there before, so the effort has been rewarded already.

The wonders of mod_rewrite

24th June 2007

When I wrote about tidying dynamic URLs a little while back, I had no inkling that there would be a second part to the tale. Then came my discovery of mod_rewrite, an Apache module that facilitates URL translation. The effect is that one URL is presented to the user in the browser address bar, and the exact same URL is also seen by search engines, while another is passed to the server for processing. It might sound like subterfuge but it works very well once you manage to get it set up properly. The web host for my hillwalking blog/photo gallery has everything configured such that it is ready to go, but the same did not apply to the offline Apache 2.2.x server that I have going on my own Windows XP box. There were two parts to getting it working there:

  1. Activating mod_rewrite on the server: this is as easy as uncommenting a line in the httpd.conf file for the site (the line in question is: LoadModule rewrite_module modules/mod_rewrite.so).
  2. Ensuring that the .htaccess file in the root of the web server directory is active. You need to set the values of the AllowOverride directives for the server root and CGI directories to All so that .htaccess is active. Not doing it for the latter will result in an error beginning with the following: Options FollowSymLinks or SymLinksIfOwnerMatch is off which implies that RewriteRule directive is forbidden. Having Allow from All set for the required directories is another option to consider when you see errors like that.

Once you have got the above sorted, add this line to .htaccess: RewriteEngine On. Preceding it with an Options directive to ensure that FollowSymLinks and SymLinksIfOwnerMatch are switched on does no harm at all and may even be needed to get things running. That done, you can set about putting mod_rewrite to work with lines like this:

RewriteRule ^pages/(.*)/?$ pages.php?query=$1

The effect of this is to take http://www.website.com/pages/input and convert it into a form for action by the server; in this case, that is http://www.website.com/pages.php?query=input. Anything captured by a bracketed group is assigned to a numbered variable; if you have several bracketed sections, they are assigned sequentially: $1 for the first, $2 for the second and so on. It’s all good stuff when you get it going, and not only does it make things look much neater, but it also possesses an advantage when it comes to future-proofing: web addresses can be kept constant over time, even if things change behind the scenes. It means that returning visitors will find what they saw the last time that they visited, and surely that must ensure good karma in the eyes of those all important search engines.
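A quick way of checking that a rule like this behaves as intended is to request both forms of the address and compare the responses; curl does the job here, assuming that it is available and that the site is being served locally:

curl -I "http://localhost/pages/input"
curl -I "http://localhost/pages.php?query=input"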

Adding a new domain or subdomain to an SSL certificate using Certbot

11th June 2019

On checking the Site Health page of a WordPress blog, I saw errors that pointed to a problem with its SSL set up. The www subdomain was not included in the site’s certificate and was causing PHP errors as a result, though they had no major effect on what visitors saw. Still, it was best to get rid of them, so the certificate needed updating. Execution of a command like the following did the job:

sudo certbot --expand -d example.com,www.example.com

Using a Let’s Encrypt certificate meant that I could use the certbot command since that already was installed on the server. The --expand and -d switches ensured that the listed domains were added to the certificate to sort out the observed problem. In the above, a dummy domain name is used but this was replaced by the real one to produce the desired effect and make things as they should have been.
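To confirm the outcome, the domains now covered by the certificate can be listed with another certbot subcommand:

sudo certbot certificates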

Using .htaccess to control hotlinking

10th October 2020

There are times when blogs cease to exist and the only place to find the content is on the Wayback Machine. Even then, it is in danger of being lost completely. One such example is the subject of this post.

Though this website makes use of the facilities of Cloudflare for various functions that include the blocking of image hotlinking, the same outcome can be achieved using .htaccess files on Apache web servers. The same idea applies on Nginx, but the rules need to go into the server configuration files instead since .htaccess files are not supported there; some frown upon the .htaccess approach even on Apache. In any case, the lines that need adding to .htaccess are listed below, though the web address needs to include your own domain in place of the dummy example provided:

RewriteEngine on
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http://(www\.)?yourdomain.com(/)?.*$ [NC]
RewriteRule .*\.(gif|jpe?g|png|bmp)$ - [F,NC]

The first line turns on the mod_rewrite engine, and you may have that done already; the module also needs enabling in your Apache configuration, and you have to be allowed to override settings with .htaccess files, which means changing the Apache configuration files. The next pair of lines examine the HTTP referer string: the second permits requests that arrive with an empty referer, while the third exempts your own web domain. To allow more domains, copy the third line and change the web address accordingly; any new lines need to precede the last line, which defines the file extensions that are to be blocked for other web addresses.

RewriteEngine on
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http://(www\.)?yourdomain.com(/)?.*$ [NC]
RewriteRule \.(gif|jpe?g|png|bmp)$ /images/image.gif [L,NC]

This variant of the previous code, shown above, changes the last line to serve a default image that tells others what is happening. That may not reduce bandwidth usage as much as complete blocking, but it does make the position clear.
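To check that the blocking is working, a request can be sent with a foreign referer and the response inspected; a blocked image should return a 403 status or the substitute image, depending on which variant is in place. The addresses below are only examples:

curl -I -e "http://some-other-site.example/" "https://www.yourdomain.com/images/photo.jpg"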

Photography Kit

7th July 2008

This is a list that I want to build up over time and I am going to limit it to the U.K. for now. As should be apparent from any commentary that I have included, I have dealt with a few of the retailers that are listed below so I hope that it comes in useful.

7dayshop.com

My biggest purchase from this Guernsey-based lot was a Canon EOS 10D body that heralded the start of my journey into the world of digital photography at the beginning of 2005. There was a time when I was wont to buy film from them too, along with other bits and pieces but I then turned to Mailshots in Stoke-on-Trent for similar pricing and quicker delivery; it often took weeks for things to arrive from Guernsey after purchase.

Ace Optics

Cameraworld

Ffordes

Prior to my entry into the world of digital photography, this lot became a port of call for several pre-owned film cameras. A Minolta X-700 came from there in 2002, as did compatible Sigma lenses and a flash gun. During 2004, I traded in my Canon EOS 300 for an EOS 30 that they had on sale and an EOS 50E was acquired as a second body. A piece of foolishness resulting from a lapse of concentration while on a visit to Harris in August has meant that the 50E has been pressed into service as my main film camera on any outings; it’s always good to have a spare and prices these days are more tempting than when I was buying second-hand equipment.

Jessops

This is a name in photographic retailing that has been brought back from the dead. Before its collapse, it was the major retailer in Britain’s town centres and there was a branch in Macclesfield. However, the focus is more on online sales now, with only a small network of city centre stores like the one on Market Street in Manchester. Having Jessops back is no bad thing and I wish them well, for it was at a branch in Stockport that I bought my first-ever SLR, a Canon EOS 300, in July 2001. Purchases of Sigma lenses followed: a 70-300 mm one in Stockport and a 28-135 mm one in Manchester. Admittedly, the latter of these saw more use than the former, but that always happens to me: I seem to be a one body, one lens man most of the time and it is only the prospect of a loss in quality that seems to keep me away from using super-zoom lenses.

London Camera Exchange

Mifsuds

Park Cameras

It seems to have been Sigma lenses for my Pentax DSLRs that I have been buying from these people. The first was an 18-125 mm offering that is the main one that I use, and next came a 50-200 mm one that extends my photographic range further into the telephoto region. That I made the second purchase from them may surprise some, given that there was a lengthy wait for the first one, but I may have asked for a less common item and I allowed for this. The 50-200 mm lens was a far more timely arrival and there may be more purchases from them yet, subject to my actually having a need to do so.

Picstop

A card reader and SD cards have been what makes up the custom that I have given this bunch. Delivery from the Isle of Man is quicker than from Jersey but you do incur additional charges even if you get that for which you are paying.

SRS Microsystems

Wex Photo Video

Formerly known as Warehouse Express, this operation has occasionally tempted me with promising goods at appealing prices. In the early days, a Sekonic light meter came from them but they now are a first port of call when pondering the prospect of a photographic purchase. Various cameras, lenses, filters and bags have been sourced there over the years.

Getting Eclipse to start without incompatibility errors on Linux Mint 19.1

12th June 2019

Recent curiosity about Java programming and Groovy scripting got me trying to start up the Eclipse IDE that I had installed on my main machine. What I got instead of a successful application startup was a message that included the following:

!MESSAGE Exception launching the Eclipse Platform:
!STACK
java.lang.ClassNotFoundException: org.eclipse.core.runtime.adaptor.EclipseStarter
at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:466)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:566)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:499)
at org.eclipse.equinox.launcher.Main.invokeFramework(Main.java:626)
at org.eclipse.equinox.launcher.Main.basicRun(Main.java:584)
at org.eclipse.equinox.launcher.Main.run(Main.java:1438)
at org.eclipse.equinox.launcher.Main.main(Main.java:1414)

The cause was a mismatch between Eclipse and the installed version of Java that it needed in order to run. After all, the software itself is written in the Java language and the version installed from the usual software repositories was too old to work with Java 11. The solution turned out to be installing a newer version as a Snap (Ubuntu’s answer to Flatpak). The following command did the needful since Snapd already was running on my machine:

sudo snap install eclipse --classic

The only part of the command that warrants extra comment is the --classic switch, since that is needed for a tool like Eclipse that has to access the host file system. On executing, the software was downloaded from Snapcraft and then installed within its own bundle of dependencies. The latter adds a certain detachment from the underlying Linux installation and ensures that no messages appear because of incompatibilities like the one near the start of this post.
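For anyone checking their own setup before reaching for the Snap, the version of Java on the path can be confirmed quickly, which helps to show whether a mismatch like the one above is likely:

java -version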

Databases & Programming

29th September 2012

The world of UNIX appears to attract those interested in the more technical aspects of computing. Since Linux is cut from the same lineage, it is apt to include lists of computing languages here. Both scripting and programming languages appear despite the title, which was shortened for the sake of brevity. Since much code cutting involves working with databases, these appear here too.

In time, I plan to correct the imbalance between programming and scripting languages that currently exists. The original list was bare, so descriptions have been added and will be more and more needed should there be any expansion of what you find here.

Programming and Scripting Languages

Apache Groovy

My first encounter with an implementation of this language was with that belonging to a statistical computing environment (SCE) and that remains an ongoing dalliance. It is easy to think of Groovy as a way of working with a Java-based API using a scripting language and it certainly feels like that. Saying that, it all works better if you know Java, though you do have to watch for the development of domain-specific language capability. That last comment probably applies to the aforementioned SCE in that it has its own object and method hierarchy that means that not all standard Groovy functionality is available.

Clojure

Clojure is a dynamic, functional programming language that runs on the Java Virtual Machine (JVM) and is designed for building robust and scalable software applications. It is characterised by its emphasis on immutability, persistent data structures, and seamless interoperability with Java. Clojure embraces the Lisp programming language’s principles, providing a concise syntax and powerful abstractions for managing state, concurrency, and functional programming paradigms. With its focus on simplicity, expressiveness, and the ability to leverage the vast Java ecosystem, Clojure enables developers to create efficient and maintainable code for a wide range of applications.

Erlang

This is a programming language designed for building highly concurrent, fault-tolerant, and scalable systems that was developed by Ericsson in the late 1980s for telecommunication systems, where reliability and performance are critical. Erlang incorporates features such as lightweight processes, message passing, and built-in support for fault tolerance, making it well-suited for developing distributed and real-time applications. Its unique concurrency model and emphasis on fault tolerance have led to its widespread use in industries such as telecommunications, banking, gaming, and web development, where systems need to handle high loads, be resilient to failures, and provide real-time responsiveness.

Elixir

Inspired by Erlang, Elixir is a functional, concurrent programming language designed for building scalable and fault-tolerant applications. It leverages the powerful concurrency model of the Erlang Virtual Machine (BEAM) while providing a more accessible and expressive syntax. It offers features such as lightweight processes, message passing, pattern matching, and a robust ecosystem of libraries and frameworks. With its focus on reliability, performance, and ease of development, Elixir is well-suited for developing highly concurrent and distributed systems, making it a popular choice for building web applications, real-time systems, and software that requires high availability.

Go

Computing languages often get strange names like single letters or small words like this one; that means that you need to look for “Golang” in any online search. In any case, Go was originated at Google and numbered among its inventors was one of the creators of the C programming language. The intent here is massively multithreaded system programming using stand-alone executable components while retaining or enhancing code readability. Another facet is the ability to function efficiently in distributed computing environments like those at SoundCloud or Uber. A variety of different tools have been written using the language and these include the ever pervasive Docker and Kubernetes.

Julia

It remains an odd decision to give a computing language a girl’s name, but the purpose is a serious one. Often, there is a trade-off between speed of code writing and speed of execution with the result being that data programming involves prototyping in one language and porting to another for production usage. The first group includes R and Python while the second includes C, C++, FORTRAN and even Java, so there is an element of translation involved that often means that different people are involved, which adds an element of error caused by misunderstandings. This gets described as the two language problem and Julia’s major raison d’être is the avoidance of that: its top-line description is that it is as quick to program as Python but runs as fast as C because of its just-in-time compilation, multiple dispatch and in-built multithreading. This also allows for extensive capabilities for scientific computing that go beyond machine learning and an example comes in the number of differential equation solvers that are available. It also helps that meta-programming makes everything more generalisable.

Perl

It has been around since the 1980s and still pervades, though it is not as dominant as it once was for creating dynamic websites or system administration. PHP has taken on much of the former while Python is making inroads into the latter. Still, no list would be complete without a mention of the once ubiquitous scripting language, and it once powered my online photo gallery. There is plenty of documentation on the web, with Perldoc, Perl Maven and Perlmeister being some good places to look, and Dan Massey has some interesting articles on his site too. Not only that, but it is extensible too, with plenty of extra modules to be found on CPAN.

PHP

This usurper has taken the place of Perl for powering many of the world’s websites. That the language is less verbose probably helps its case and many if not most CMS packages make use of its versatility.

Python

It may be Google’s preferred scripting language for system administration but it is its usefulness for Data Science where it really has shone in the eyes of many. There are numerous packages for data wrangling, data visualisation and machine learning that make the language ever present in any Data Scientist’s toolbox and looking in the PyPi archive will allow you to find what you need. It also has its place in web scripting too, even if it is not as pervasive as PHP though CMS’s like Plone run on Python and there is the Django framework together with the Gunicorn web server.

OpenJDK

One of the acts of Jonathan Schwartz while he was head of Sun Microsystems was to make Java open source after more than a decade of its being largely proprietary, and this is the website for the project. Of course, his more notable act at Sun was to sell the company to Oracle, but that’s another story altogether…

R

This is an open-source implementation of the S language that is much appreciated by statisticians and is much used in the teaching of the subject. The base language only has so much functionality, but there are many packages that extend it, with plenty to be found on repositories like CRAN and others on various GitHub repositories, though the latter tend to be more experimental in nature. There are commonly used and well-supported mainstays that everyone uses, but there always is a need to verify that a particular package does what it claims to do. Given that, there are possibilities for data wrangling, data tabulation, data visualisation and data science. While quick to code, R is slow to execute compared with others; I have found that Python is faster, but R still has a use for smaller data sets. Both keep their temporary data sets in system memory, so that helps.

Rust

It came as a surprise that this Mozilla-originated language is gaining traction in scientific data analysis, possibly because it is a fast multithreaded counterpart to C and C++ with some added safety features (though these can be turned off if needed and extra care gets taken). The downsizing of Mozilla led to a sharp reduction in its team of Rust developers and the Rust Foundation has been set up to oversee the language instead. There are online books like The Rust Programming Language and the Rust Cookbook, with the first of these also having paper and e-book counterparts from No Starch Press. For those interested in a more interactive introduction, there also is the Tour of Rust.

Databases

MariaDB

This essentially is a fork of MySQL (see below) now that Oracle owns it. The originators of MySQL are the creators of MariaDB so their claims of it being a drop-in replacement for it may have some traction. So far, I have seen no exodus from MySQL, though.

MySQL

After passing through the hands of a number of owners, it incongruously came into the custodianship of Oracle (who of course already had, and still have, a database system of their own). Even so, the database system that powers many dynamic websites almost remains a de facto standard and looks set to remain thus for now.

MongoDB

This may be a document-based rather than a relational database as many of us understand them, but it still is being touted as an alternative to the more mainstream competition. Database technology isn’t just about SQL and MongoDB champions a NoSQL approach; it sounds as if the emergence of XML might be what’s facilitating the NoSQL database technologies.

PostgreSQL

This project may have more open-source credibility than MySQL, but it seems to remain in its shadow, though that may be explained by its being a more complex piece of software to use (at least, that has been my experience, anyway). It so happens that this is what Debian installs if you specify the web server option at operating system installation time.
